posterior collapse
An issue primarily affecting [[variational autoencoder]] models, in which the output becomes deterministic in a supposedly probabilistic model.
Some annealing schemes, such as [[cyclical-annealing-schedule-a-simple-approach-to-mitigating-kl-vanishing]], have been developed to help VAEs learn informative latent variables while maintaining good reconstruction accuracy.
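The idea behind such schemes is to down-weight the KL term early in training so the decoder is forced to use the latent variable before the KL penalty pushes the posterior toward the prior. A minimal sketch of a cyclical schedule (hypothetical helper, not taken from the linked paper's code; `n_cycles` and `ratio` are assumed parameter names): the KL weight ramps linearly from 0 to 1 over the first `ratio` of each cycle, saturates at 1, and then the cycle restarts.

```python
def cyclical_beta(step, total_steps, n_cycles=4, ratio=0.5):
    """KL weight at a given training step under a cyclical annealing schedule.

    Ramps linearly 0 -> 1 over the first `ratio` of each cycle,
    then holds at 1 until the cycle restarts.
    """
    period = total_steps / n_cycles
    pos = (step % period) / period  # position within the current cycle, in [0, 1)
    return min(1.0, pos / ratio)

# During training, the ELBO would then be weighted as:
#   loss = reconstruction_loss + cyclical_beta(step, total_steps) * kl_divergence
```

Restarting the ramp from zero several times gives the encoder repeated chances to re-learn an informative posterior, which is what helps mitigate KL vanishing.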
[variational autoencoder]: variational autoencoder "variational autoencoder"
Backlinks
variational autoencoder
A problem with VAEs is that they are susceptible to [[posterior collapse]], where a single output is produced regardless of the input: the decoder ignores the latent variable completely and becomes a deterministic model.
variational autoencoder
- [[beta-VAE]] provides a tuning parameter, $\beta$, that attempts to force disentangling of the latent vector toward a unit Gaussian prior. Relatively easy to understand and implement; however, it quite readily suffers from [[posterior collapse]].